
    On Optimization Modulo Theories, MaxSMT and Sorting Networks

    Optimization Modulo Theories (OMT) is an extension of SMT which allows for finding models that optimize given objectives. (Partial weighted) MaxSMT -- or equivalently OMT with Pseudo-Boolean objective functions, OMT+PB -- is a very relevant strict subcase of OMT. We classify existing approaches for MaxSMT or OMT+PB into two groups: MaxSAT-based approaches exploit the efficiency of state-of-the-art MaxSAT solvers, but they are special-purpose and not always applicable; OMT-based approaches are general-purpose, but they suffer from intrinsic inefficiencies on MaxSMT/OMT+PB problems. We identify a major source of such inefficiencies, and we address it by enhancing OMT by means of bidirectional sorting networks. We implemented this idea on top of the OptiMathSAT OMT solver. We ran an extensive empirical evaluation on a variety of problems, comparing MaxSAT-based and OMT-based techniques, with and without sorting networks, implemented on top of OptiMathSAT and νZ. The results support the effectiveness of this idea and provide interesting insights into the different approaches.
    Comment: 17 pages, submitted at Tacas 1
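The core idea behind cardinality-style objectives can be illustrated at the propositional level. The following is a minimal sketch (not the paper's OptiMathSAT implementation, which encodes the comparators symbolically as clauses): an odd-even transposition sorting network evaluated on concrete booleans, showing why sorted outputs make "at least k inputs are true" a single output wire that a solver can branch on.

```python
# Sketch only: a sorting network over booleans. In an OMT/MaxSMT solver the
# comparators would be encoded over literals, not evaluated eagerly.

def sort_desc(bits):
    """Odd-even transposition sorting network: True values bubble to the front."""
    bits = list(bits)
    n = len(bits)
    for rnd in range(n):
        for i in range(rnd % 2, n - 1, 2):
            # a comparator gate is (OR, AND) == (max, min) for booleans
            bits[i], bits[i + 1] = bits[i] or bits[i + 1], bits[i] and bits[i + 1]
    return bits

def at_least(k, bits):
    """After sorting, 'at least k inputs are True' is just output wire k-1."""
    return sort_desc(bits)[k - 1]

print(at_least(2, [False, True, False, True]))  # True: two inputs are set
print(at_least(3, [False, True, False, True]))  # False
```

Because each cardinality bound maps to one output wire, the solver avoids the pairwise counting constraints that the abstract identifies as a source of inefficiency.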

    Fostering research and innovation in materials manufacturing for Industry 5.0: The key role of domain intertwining between materials characterization, modelling and data science

    Recent advances in materials modelling, characterization and materials informatics suggest that deep integration of such methods can be a crucial aspect of the Industry 5.0 revolution, where the fourth industrial revolution paradigms are combined with the concepts of transition to a sustainable, human-centric and resilient industry. In this short communication we pose a specific deep-integration challenge, beyond the ordinary multi-disciplinary modelling/characterization research approach, with research and innovation as drivers for scientific excellence. Full integration can be achieved by developing common ontologies across different domains, enabling meaningful computational and experimental data integration and interoperability. On this basis, fine-tuning of adaptive materials modelling/characterization protocols can be achieved, facilitating computational and experimental efforts. Such interoperable and meaningful data, combined with advanced data science tools (including machine learning and artificial intelligence), become a powerful asset for materials scientists to extract complex information from the large amount of data generated by last-generation characterization techniques. To achieve this ambitious goal, significant collaborative actions are needed to develop common, usable, and sharable digital tools that allow for effective and efficient twinning of data and workflows across the different materials modelling and characterization domains.

    Enhancing Sensitivity Classification with Semantic Features using Word Embeddings

    Government documents must be reviewed to identify any sensitive information they may contain, before they can be released to the public. However, traditional paper-based sensitivity review processes are not practical for reviewing born-digital documents. Therefore, there is a timely need for automatic sensitivity classification techniques to assist the digital sensitivity review process. However, sensitivity is typically a product of the relations between combinations of terms (such as who said what about whom); therefore, automatic sensitivity classification is a difficult task. Vector representations of terms, such as word embeddings, have been shown to be effective at encoding latent term features that preserve semantic relations between terms, which can also be beneficial to sensitivity classification. In this work, we present a thorough evaluation of the effectiveness of semantic word embedding features, along with term and grammatical features, for sensitivity classification. On a test collection of government documents containing real sensitivities, we show that extending text classification with semantic features and additional term n-grams results in significant improvements in classification effectiveness, correctly classifying 9.99% more sensitive documents than the text classification baseline.
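A hedged sketch of the general feature-combination idea (the toy embedding table, vocabulary, and example sentence are illustrative assumptions, not the paper's data or setup): dense semantic features from averaged word embeddings are computed alongside sparse term n-gram counts, and both would feed a downstream classifier.

```python
# Illustrative only: combining averaged word-embedding features with
# term n-gram features for a text classifier.
from collections import Counter

EMB = {  # toy 3-dimensional "word embeddings" (assumed, not real vectors)
    "minister": [0.9, 0.1, 0.0],
    "criticised": [0.2, 0.8, 0.1],
    "ambassador": [0.8, 0.2, 0.1],
}

def embedding_feature(tokens, emb=EMB):
    """Average the embeddings of in-vocabulary tokens (zero vector if none)."""
    dim = len(next(iter(emb.values())))
    vecs = [emb[t] for t in tokens if t in emb]
    if not vecs:
        return [0.0] * dim
    return [sum(v[i] for v in vecs) / len(vecs) for i in range(dim)]

def ngram_features(tokens, n=2):
    """Count contiguous token n-grams (bigrams by default)."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

doc = ["the", "minister", "criticised", "the", "ambassador"]
sem = embedding_feature(doc)   # dense semantic part
grams = ngram_features(doc)    # sparse term part
# A linear classifier (e.g. logistic regression) would be trained on both.
```

The design point the abstract makes is that the dense part captures "who said what about whom"-style latent relations that sparse n-grams alone miss.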

    Personalized Medicine and Machine Learning: A Roadmap for the Future

    In the last ten years, many advances have been made in the treatment and diagnosis of immune-mediated diseases [...]

    Diagnosis, Clinical Features and Management of Interstitial Lung Diseases in Rheumatic Disorders: Still a Long Journey

    Interstitial lung disease (ILD) is one of the most frequent pulmonary complications of autoimmune rheumatic diseases (ARDs), and it is mainly associated with connective tissue diseases (CTDs) and rheumatoid arthritis (RA) [...]

    From LTL and Limit-Deterministic Büchi Automata to Deterministic Parity Automata

    Controller synthesis for general linear temporal logic (LTL) objectives is a challenging task. The standard approach involves translating the LTL objective into a deterministic parity automaton (DPA) by means of the Safra-Piterman construction. One of the challenges is the size of the DPA, which often grows very fast in practice, and can reach double exponential size in the length of the LTL formula. In this paper we describe a single exponential translation from limit-deterministic Büchi automata (LDBA) to DPA, and show that it can be concatenated with a recent efficient translation from LTL to LDBA to yield a double exponential, "Safraless" LTL-to-DPA construction. We also report on an implementation, a comparison with the SPOT library, and performance on several sets of formulas, including instances from the 2016 SyntComp competition.
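To make the target acceptance condition concrete, here is a minimal sketch (the toy automaton, state names and priority assignment are assumptions, not the paper's construction) of the min-even parity condition a DPA uses: on an ultimately periodic word u·v^ω, the run is accepting iff the least priority visited infinitely often is even.

```python
# Sketch: checking parity acceptance of a DPA on an ultimately periodic word.

def dpa_accepts(delta, prio, start, prefix, cycle):
    """delta: (state, letter) -> state; prio: state -> priority (min-even)."""
    q = start
    for a in prefix:                 # consume the finite prefix u
        q = delta[(q, a)]
    seen, log = {}, []               # cycle-boundary states; visited priorities
    while q not in seen:
        seen[q] = len(log)
        for a in cycle:              # one pass over the period v
            q = delta[(q, a)]
            log.append(prio[q])
    # the segment from the first repeated boundary state onward recurs forever
    return min(log[seen[q]:]) % 2 == 0

# Toy DPA for "infinitely many a's" over the alphabet {a, b}
delta = {("q0", "a"): "qa", ("q0", "b"): "q0",
         ("qa", "a"): "qa", ("qa", "b"): "q0"}
prio = {"q0": 1, "qa": 0}

print(dpa_accepts(delta, prio, "q0", [], ["a", "b"]))  # True: (ab)^w
print(dpa_accepts(delta, prio, "q0", ["a"], ["b"]))    # False: a b^w
```

The determinism that makes this check so simple is exactly what the LDBA-to-DPA translation buys for synthesis, where nondeterministic Büchi acceptance cannot be resolved on the fly.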

    Insights from the classical MD simulations

    Salt bridges and ionic interactions play an important role in protein stability, protein-protein interactions, and protein folding. Here, we provide classical MD simulations of the structure and IR signatures of the arginine (Arg)–glutamate (Glu) salt bridge. The Arg–Glu model is based on the infinite polyalanine antiparallel two-stranded β-sheet structure. The 1 μs NPT simulations show that it preferably exists as a salt bridge (a contact ion pair). Bidentate (the end-on and side-on structures) and monodentate (the backside structure) configurations are localized [Donald et al., Proteins 79, 898–915 (2011)]. These structures are stabilized by short +N–H⋯O− bonds. Their relative stability depends on the force field used in the MD simulations: the side-on structure is the most stable according to the OPLS all-atom (OPLS-AA) force field, whereas with AMBER ff99SB-ILDN the backside structure is the most stable. Compared with experimental data, simulations using the OPLS-AA force field describe the stability of the salt bridge structures quite realistically; it decreases in the order side-on > end-on > backside. The most stable side-on structure lives for several nanoseconds, while the less stable backside structure exists for a few tenths of a nanosecond. Several short-lived species (solvent-shared ion pairs, completely separately solvated ionic groups, etc.) are also localized; their lifetimes are a few tens of picoseconds or less. The conformational flexibility of the amino acids forming the salt bridge is investigated. The spectral signature of the Arg–Glu salt bridge is the IR-intensive band around 2200 cm−1, caused by the asymmetric stretching vibrations of the +N–H⋯O− fragment. The results of the present paper suggest that infrared spectroscopy in the 2000–2800 cm−1 frequency region may be a rapid and quantitative method for studying salt bridges in peptides and ionic interactions between proteins. This region is usually not considered in spectroscopic studies of peptides and proteins.
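A hedged geometric sketch of how such configurations might be labelled in a trajectory analysis (the distance cutoff and coordinates are illustrative assumptions, not taken from the paper): counting short +N–H⋯O− contacts between the Arg guanidinium hydrogens and the Glu carboxylate oxygens distinguishes bidentate from monodentate contact ion pairs.

```python
# Sketch only: classify one MD snapshot of an Arg-Glu pair by counting
# short H...O contacts. The 2.2 Angstrom cutoff is an assumed threshold.
import math

def classify_bridge(arg_h_atoms, glu_o_atoms, cutoff=2.2):
    """Label a snapshot by the number of short H...O hydrogen bonds."""
    contacts = sum(1 for h in arg_h_atoms for o in glu_o_atoms
                   if math.dist(h, o) <= cutoff)
    if contacts >= 2:
        return "bidentate"        # end-on / side-on style contact ion pair
    if contacts == 1:
        return "monodentate"      # backside-style contact
    return "dissociated"          # solvent-shared or fully solvated

# toy coordinates in Angstrom (assumed, for illustration)
arg_h = [(0.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
glu_o = [(0.0, 1.8, 0.0), (2.0, 1.9, 0.0)]
print(classify_bridge(arg_h, glu_o))  # bidentate: two H...O pairs within 2.2 A
```

Applying such a label frame by frame over the trajectory is what yields the lifetime statistics (nanoseconds for side-on, tenths of a nanosecond for backside) reported in the abstract.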

    Pirfenidone for the treatment of interstitial lung disease associated to rheumatoid arthritis: a new scenario is coming?

    Introduction: Interstitial lung disease (ILD) is a frequent extra-articular manifestation of rheumatoid arthritis (RA), but to date there are no randomized controlled clinical trials to support therapeutic guidelines. RA-ILD, especially with a UIP pattern, shares some similarities with idiopathic pulmonary fibrosis, suggesting a possible role for antifibrotic therapy in these patients. To date, there are no published data supporting the use of pirfenidone in RA-ILD. We describe for the first time two patients with a diagnosis of RA-ILD successfully treated with hydroxychloroquine and pirfenidone, without adverse events. Case presentation: Patients 1 and 2 were first diagnosed with IPF (UIP pattern on high-resolution computed tomography; no other signs or symptoms suggesting other forms of ILD; routine laboratory examinations and immunological tests negative). Both patients started pirfenidone 2403 mg daily. A few months later, they were referred to our multidisciplinary outpatient clinic for arthritis. ACPA and RF were positive, a diagnosis of RA was made, and treatment with corticosteroids and hydroxychloroquine was started in association with pirfenidone. In both cases we observed stabilization of the articular and lung manifestations, without adverse events. Discussion: In the absence of randomized controlled trials, the optimal treatment of RA-ILD has not been determined and remains challenging. When considering therapeutic options for RA-ILD, both pulmonary and extra-thoracic disease manifestations and their degrees of activity should be assessed and taken into consideration. Future prospective research might change RA-ILD management, moving toward a more personalized approach based on the identification of different disease phenotypes or on a combination of immunosuppressive and antifibrotic treatment.